63 research outputs found

    Automatic Construction of Predictive Neuron Models through Large Scale Assimilation of Electrophysiological Data.

    We report on the construction of neuron models by assimilating electrophysiological data with large-scale constrained nonlinear optimization. The method implements an interior point method with line search to determine parameters from the responses of zebra finch HVC neurons to intracellular current injections. We incorporated these parameters into a conductance model with nine ionic channels to obtain complete models, which we then used to predict the state of the neuron under arbitrary current stimulation. Each model was validated by successfully predicting the dynamics of the membrane potential induced by 20-50 different current protocols. The dispersion of parameters extracted from different assimilation windows was studied. Differences in constraints from current protocols, stochastic variability in neuron output, and noise behave as a residual temperature which broadens the global minimum of the objective function into an ellipsoid domain whose principal axes follow an exponentially decaying distribution. The maximum likelihood expectation of the extracted parameters was found to provide an excellent approximation of the global minimum and yields highly consistent kinetics for both neurons studied. Large-scale assimilation absorbs the intrinsic variability of electrophysiological data over wide assimilation windows. It builds models automatically, treating all data as equal quantities and requiring minimal additional insight.
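
    As a rough illustration of the assimilation step described above, the sketch below fits a toy single-compartment conductance model (a leak plus one sodium-like current, not the paper's nine-channel model) to a synthetic voltage trace by constrained least squares, with scipy's trust-constr method standing in for the interior point solver. All equations, parameter names, bounds, and numerical values here are assumptions chosen for illustration, not the authors' implementation.

```python
# Minimal sketch: data assimilation of a voltage trace by constrained nonlinear
# optimization. Toy model and parameter values are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize, Bounds

dt = 0.05          # ms, integration time step
C = 1.0            # uF/cm^2, membrane capacitance (held fixed)

def simulate(params, i_inj, v0=-65.0):
    """Forward-Euler integration of a toy conductance model (leak + one current)."""
    g_l, e_l, g_na, e_na, v_half, k = params
    v = np.empty_like(i_inj)
    v[0] = v0
    for t in range(1, len(i_inj)):
        m_inf = 1.0 / (1.0 + np.exp(-(v[t - 1] - v_half) / k))   # steady-state activation
        i_ion = g_l * (v[t - 1] - e_l) + g_na * m_inf * (v[t - 1] - e_na)
        v[t] = v[t - 1] + dt * (i_inj[t - 1] - i_ion) / C
    return v

def objective(params, i_inj, v_data):
    """Least-squares misfit between model and data over the assimilation window."""
    return np.mean((simulate(params, i_inj) - v_data) ** 2)

# Synthetic "recording": response of the toy model to a current step, plus noise.
rng = np.random.default_rng(0)
t = np.arange(0.0, 100.0, dt)
i_inj = np.where((t > 20.0) & (t < 80.0), 2.0, 0.0)
true_params = np.array([0.1, -65.0, 0.5, 50.0, -40.0, 5.0])
v_data = simulate(true_params, i_inj) + 0.5 * rng.standard_normal(len(t))

# Box constraints keep conductances positive and reversal potentials physiological.
bounds = Bounds([0.01, -90.0, 0.0, 20.0, -60.0, 1.0],
                [1.0, -40.0, 5.0, 70.0, -20.0, 15.0])
x0 = np.array([0.2, -60.0, 1.0, 45.0, -35.0, 8.0])

result = minimize(objective, x0, args=(i_inj, v_data),
                  method="trust-constr", bounds=bounds,
                  options={"maxiter": 100})
print("estimated parameters:", result.x)
print("true parameters:     ", true_params)
```

    Because the misfit is averaged over the whole assimilation window, several current protocols could in principle be concatenated into i_inj to constrain the parameters more tightly, in the spirit of the wide-window assimilation described in the abstract.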

    Effect of size and configuration on the magnetization of nickel dot arrays

    Negative differential resistance in graphite-silicone polymer composites

    Inhibition Delay Increases Neural Network Capacity through Stirling Transform

    Inhibitory neural networks are found to encode high volumes of information through delayed inhibition. We show that inhibition delay increases storage capacity through a Stirling transform of the minimum capacity, which stabilizes locally coherent oscillations. We obtain both exact and asymptotic formulas for the total number of dynamic attractors. Our results predict a (ln 2)^{-N}-fold increase in capacity for an N-neuron network and demonstrate high-density associative memories that host a maximum number of oscillations in analog neural devices.
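
    For context, the Stirling transform maps a sequence a_k to b_n = sum_k S(n,k) a_k, where S(n,k) are Stirling numbers of the second kind. The sketch below does not reproduce the paper's attractor-count formula; as an illustrative assumption it applies the transform to a_k = k!, which yields the ordered Bell (Fubini) numbers, whose known asymptotics n!/(2 (ln 2)^{n+1}) show how a (ln 2)^{-N}-type gain over the untransformed count can arise.

```python
# Illustrative sketch of the Stirling transform and the (ln 2)^{-N} growth factor.
# The choice a_k = k! is an assumption for illustration, not the paper's formula.
from math import factorial, log

def stirling2(n, k, _cache={}):
    """Stirling numbers of the second kind S(n, k) via the standard recurrence."""
    if k == n:
        return 1
    if k == 0 or k > n:
        return 0
    if (n, k) not in _cache:
        _cache[(n, k)] = k * stirling2(n - 1, k, _cache) + stirling2(n - 1, k - 1, _cache)
    return _cache[(n, k)]

def stirling_transform(a):
    """b_n = sum_k S(n, k) * a_k for a sequence a_0, a_1, ..."""
    return [sum(stirling2(n, k) * a[k] for k in range(n + 1)) for n in range(len(a))]

N = 12
fubini = stirling_transform([factorial(k) for k in range(N + 1)])
for n in range(1, N + 1):
    asymptotic = factorial(n) / (2 * log(2) ** (n + 1))     # n!/(2 (ln 2)^{n+1})
    gain = fubini[n] / factorial(n)                         # ratio to the untransformed count
    print(f"N={n:2d}  exact={fubini[n]:>12d}  asymptotic~{asymptotic:12.1f}  gain~{gain:10.2f}")
```

    The gain column is the ratio b_N / N!, which grows roughly as (ln 2)^{-N}, mirroring the scaling quoted in the abstract.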

    Direct pressure sensing with carbon nanotubes grown in a micro-cavity

    Experimental observation of multi-stability and dynamic attractors in silicon central pattern generators

    Noise-activated barrier crossing in multi-attractor spiking networks
